To compare usage figures, we first compared the lists of databases available at our universities, and while doing so we realised that these lists differ considerably. It is rather surprising that three European health sciences libraries are so dissimilar even in this respect.
Many databases are held by two of us but not the third, and the missing party varies: any of the three libraries may be the one lacking a database the other two provide. Beyond the two databases we chose, there are others to which we all provide access, but their usage figures are unavailable or not comparable for various reasons. PubMed, for example, is naturally widely used at all three universities, but as a freely accessible interface it cannot supply this kind of usage data.
The databases we have chosen are SciFinder and Scopus.
- SciFinder is an information retrieval system for searching several databases in chemistry and related areas. It includes CAPLUS, a literature reference database containing references to more than 9 000 journals on chemistry and related subjects as well as patents, technical reports, books, conference proceedings and academic dissertations; CAS REGISTRY, a factual database of over 50 million chemical substances (e.g. trade names, chemical names, chemical structures); CASREACT, which contains 8 million single-phase and multiphase chemical reactions; CHEMCATS, a trade catalogue of 7 million chemicals; CHEMLIST, a catalogue of 232 000 regulated chemicals; and MEDLINE, a biomedical journal article reference database.
- Scopus is a large abstract and citation database of research literature and quality web sources. It covers nearly 19 000 peer-reviewed journals from more than 5 000 publishers and also includes the EMBASE and MEDLINE databases, as well as book series and conference papers, for a total of 46 million records. In addition, Scopus covers 433 million quality web sources, including 24 million patents.
This is how the figures of usage (number of searches) look for the year 2015:
| | NTNU | UCL | UEF |
|---|---:|---:|---:|
| SciFinder searches | 28 146 | 25 847 | 9 842 |
| Scopus searches | 131 440 | 182 384 | 68 752 |
For the year 2016 they look like this:
| | NTNU | UCL | UEF |
|---|---:|---:|---:|
| SciFinder searches | 21 097 | 25 090 | 9 473 |
| Scopus searches | 143 272 | 722 761 | 85 167 |
How does this compare to the number of potential users? In all, NTNU had about 22 000 students and a staff of 6 700; UCL had 24 000 students and a staff of 5 840; and UEF, roughly half the size of the other two, had approximately 11 000 students and a staff of 2 800. Not all of them are potential users of the rather specialised SciFinder, but all of them are potential users of the multidisciplinary Scopus. So, here is a comparison of Scopus usage in 2016 per student and staff member at each university:
| | NTNU | UCL | UEF |
|---|---:|---:|---:|
| Scopus searches per potential user | 5 | 24 | 5 |
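As a sanity check, these per-user figures follow from dividing the 2016 Scopus search counts by the combined student and staff numbers quoted above. A minimal sketch in Python (small deviations from the table come from rounding and from the approximate user counts):

```python
# 2016 Scopus searches and potential users (students + staff) per university,
# taken from the figures quoted in the text
scopus_2016 = {"NTNU": 143_272, "UCL": 722_761, "UEF": 85_167}
users = {"NTNU": 22_000 + 6_700, "UCL": 24_000 + 5_840, "UEF": 11_000 + 2_800}

for uni, searches in scopus_2016.items():
    per_user = searches / users[uni]
    print(f"{uni}: {per_user:.1f} Scopus searches per potential user")
```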
The difference is striking. What could it mean? Should NTNU and UEF market Scopus more actively? Probably. Or is there an alternative on their database lists that UCL does not have?
The Web of Science (WoS) is the one that comes to mind as another multidisciplinary database. What do the WoS usage figures show?
| | NTNU | UEF |
|---|---:|---:|
| Web of Science searches | 151 608 | 65 640 |
| Web of Science searches per potential user | 5 | 5 |
At NTNU and UEF, WoS is probably one reason for the lower usage of Scopus, but even the combined Scopus and WoS usage per user at NTNU and UEF does not come close to the Scopus-only figure at UCL. It is difficult to know what lies behind these figures.
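The arithmetic behind that observation can be checked directly from the 2016 figures quoted above; a minimal sketch in Python:

```python
# Combined Scopus + WoS searches per potential user in 2016,
# compared with UCL's Scopus-only figure (UCL does not offer WoS)
users = {"NTNU": 22_000 + 6_700, "UCL": 24_000 + 5_840, "UEF": 11_000 + 2_800}
scopus = {"NTNU": 143_272, "UCL": 722_761, "UEF": 85_167}
wos = {"NTNU": 151_608, "UEF": 65_640}

for uni in ("NTNU", "UEF"):
    combined = (scopus[uni] + wos[uni]) / users[uni]
    print(f"{uni}: {combined:.1f} combined searches per potential user")
print(f"UCL: {scopus['UCL'] / users['UCL']:.1f} Scopus-only searches per potential user")
```

Both combined figures land around 10 to 11 searches per potential user, still well below UCL's roughly 24 for Scopus alone.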
At UCL, as at NTNU and UEF, students are taught to use Scopus during library training sessions. Moreover, research evaluation (for international, national, regional and university funding) and individual academic promotion are largely based on bibliometrics, while the francophone universities in Belgium do not offer WoS (neither does the Belgian scientific fund, FNRS-FRS).
If we could publish information about how much each university pays for these databases, we might be able to say something about value for money, but at the moment that is not possible. Perhaps the paradigm of open science will progress in that direction too, at least where the funders are the taxpayers. Once again, comparing and benchmarking internationally is not easy at all. It is complicated and time-consuming, but also interesting and thought-provoking.