Passing on the benchmarking baton — welcome to our workshop in Cardiff

We are delighted and honoured to see that the interactive workshop we will facilitate on Wednesday 11th July from 14:00 to 15:30 in Studio 2 in Cardiff is already fully booked. Perhaps that is not so surprising, since we accept only 20 workshop attendees: the required level of participant activity is very high.

The title — and topic of the workshop — is Passing on the benchmarking baton: workshop on cooperation methods, using new indicators, finding partners, and reporting results. The methods will include speed dating, brainstorming, and brainwriting.

Our workshop aims to:

  1. Share methods and tools;
  2. Encourage cooperation and new partnerships between libraries and librarians;
  3. Build on new indicators that were identified during the Dublin workshop;
  4. Identify themes and methods for new benchmarking projects;
  5. Find methods and channels to report to colleagues.

As a participant, you will have the opportunity to meet new partners for future benchmarking projects and to learn and practise tools to use in your workshops.

Why do we want to pass on the benchmarking baton?

For about five years, we have collected and tried to compare plenty of data and statistics, had dozens of online meetings and a few live meetings, made site visits to each of the participating libraries, interviewed library users, interviewed library staff, and discussed the library as a place, marketing, information skills training, and many other aspects of our work. We have also maintained this blog, where we have shared experiences and thoughts on benchmarking issues.

We have also held a focus group session at the EAHIL+ICAHIS+ICML 2015 workshop in Edinburgh, prepared and presented a paper at the EAHIL 2016 conference in Seville, and facilitated an interactive workshop session at ICML+EAHIL 2017 in Dublin. Most importantly, we have learned a lot.

Now it is finally time not only to pass on the benchmarking baton to you; we also need to move on — to new projects and roles, to different challenges and experiences.

Part of a workshop planning document


So, what we are proposing is that you start benchmarking! Our benchmarking project led us to compare different ways of organising library space and services, of managing staff and coordinating relationships within and outside the institution, and of training users in information literacy and establishing and maintaining connections with faculty and hospital. Current and traditional statistics did not help us: they could be compared, but they did not give us useful information, partly because they do not cover the same reality (financial figures, in particular). ISO indicators were difficult to use because our countries, working cultures, and practices differ.

We came to the point where we wanted to investigate the value of the library, and what we needed were new indicators to compare this value. One impact of our project was that EAHIL members took part in our benchmarking and collaborated in our workshops to propose these new indicators, which could better address our goals. We hope this last workshop will help you to set your goals and encourage you to start something as rewarding as our project.

If you have not attended our previous workshop (or participated in the focus group or heard the presentation) or read this blog before, please take a look at the background, the participating libraries, and the About page, in addition to reading at least some of the linked posts — most of them are shorter than this one.

We would also really appreciate it if you could briefly comment on who you are, where you work and/or study, and why you have chosen our workshop.

See you in Cardiff!


Statistics of usage — Can the figures of 2 or 3 databases be useful in benchmarking?

To compare usage figures, we first compared the lists of databases available at our universities, and realised that these lists are very different. It is rather surprising that three European health sciences libraries are not very similar even in this respect.

Many databases are available at two of the libraries but not the third, and the missing one can be any of the three. We all provide access to more shared databases than the two we have chosen here, but usage data for the others is not available or comparable, for various reasons. For example, PubMed is naturally widely used at all three universities, but as a freely accessible interface it cannot provide this kind of information for us.


Students using databases at UCL

The databases we have chosen are SciFinder and Scopus.

  • SciFinder is an information retrieval system for searching several databases in chemistry and related areas. It includes CAPLUS, a literature reference database containing references to more than 9 000 journals on chemistry and related subjects, as well as patents, technical reports, books, conference proceedings, and academic dissertations; CAS REGISTRY, a factual database of over 50 million chemical substances (e.g. trade names, chemical names, chemical structures); CASREACT, which contains 8 million single- and multi-step chemical reactions; CHEMCATS, a trade catalogue of 7 million chemicals; CHEMLIST, a catalogue of 232 000 regulated chemicals; and MEDLINE, a biomedical journal article reference database.
  • Scopus is a large abstract and citation database of research literature and quality web sources. Scopus offers nearly 19,000 peer-reviewed journals from more than 5,000 publishers, and it also contains the EMBASE and MEDLINE databases, as well as book series and conference papers. The number of records is 46 million. Moreover, Scopus covers 433 million quality web sources, including 24 million patents.

This is how the figures of usage (number of searches) look for the year 2015:

                        NTNU      UCL      UEF
SciFinder searches    28 146   25 847    9 842
Scopus searches      131 440  182 384   68 752

For the year 2016, they look like this:

                        NTNU      UCL      UEF
SciFinder searches    21 097   25 090    9 473
Scopus searches      143 272  722 761   85 167

How does this compare to the number of potential users? In all, NTNU had about 22 000 students and a staff of 6 700, while UCL had 24 000 students and a staff of 5 840, and finally, UEF — being half the size of the other two — had approximately 11 000 students and a staff of 2 800. Not all of them are potential users of the rather specialised SciFinder, but they are all potential users of the multidisciplinary Scopus. So, here is a comparison of Scopus usage in 2016 per potential user (students plus staff) at each university:

                                      NTNU   UCL   UEF
Scopus searches per potential user       5    24     5

The difference is significant. What could it mean? Should NTNU and UEF start marketing Scopus more? Probably. Or is there an alternative choice on their list of databases that UCL does not have?

The Web of Science (WoS) is the other multidisciplinary database that comes to mind. What do the figures of WoS usage show?

                                              NTNU     UEF
Web of Science searches                    151 608  65 640
Web of Science searches per potential user       5       5

At NTNU and UEF, WoS is probably one reason for the lower usage of Scopus, but even the combined Scopus and WoS usage per user at NTNU and UEF does not come close to the Scopus-only figure at UCL. It is difficult to know what is behind these figures.
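That claim can be checked with some quick arithmetic on the figures quoted in this post. This is a rough sketch: the populations are approximate, and the rounded per-user rates may differ slightly from the published tables.

```python
# Per-user usage rates recomputed from the approximate 2016 figures above.

potential_users = {  # students + staff
    "NTNU": 22_000 + 6_700,
    "UCL": 24_000 + 5_840,
    "UEF": 11_000 + 2_800,
}

scopus_2016 = {"NTNU": 143_272, "UCL": 722_761, "UEF": 85_167}
wos_2016 = {"NTNU": 151_608, "UEF": 65_640}  # UCL does not offer WoS

for lib, users in potential_users.items():
    scopus_rate = scopus_2016[lib] / users
    combined_rate = (scopus_2016[lib] + wos_2016.get(lib, 0)) / users
    print(f"{lib}: Scopus {scopus_rate:.1f}, Scopus+WoS {combined_rate:.1f} searches per user")
```

Even with WoS added in, NTNU and UEF land at roughly 10-11 searches per potential user, still far below UCL's Scopus-only rate of about 24.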

At UCL, as well as at NTNU and UEF, students are taught to use Scopus during library training sessions. Moreover, research evaluation (for international, national, regional and university funding) and individual academic promotion are largely based on bibliometrics, while francophone universities in Belgium do not offer WoS (neither does the Belgian scientific fund, FNRS-FRS).

If we could publish information about how much each of the universities pays for these databases, we might be able to say something about value for money, but at the moment that is not possible. Maybe the paradigm of open science will progress in that direction too, at least when the funders are the taxpayers. Once again, comparing and benchmarking internationally is not easy at all. It is complicated and time-consuming, but also interesting and thought-provoking.


Happy benchmarkers at NTNU main library building

Outcomes from ICML+EAHIL2017 workshop

At ICML+EAHIL2017 we facilitated a workshop called Cooperation and benchmarking – finding the value and impact together.

We invited the participants to take part in our benchmarking project. We wanted them to help us identify more future-oriented indicators* and also to discuss how — or if — benchmarking can provide tools for creating an evidence base for health librarianship. The goal of the workshop was to find some new and exciting ideas to take further. We used brainwriting as a tool to find and refine the ideas.


Preparations the day before.

The first part of the session was spent identifying new ideas for indicators to help measure impact and value in international (health) library benchmarking.

The best ideas for indicators from this brainwriting session — quotations from post-it notes — were:

  • Number of high “grade” student essays/exam papers in relation to librarian time spent teaching/tutoring
  • How has the literature search been used to change practice?
  • Impact on national health policies index/indicator
  • When host organisation cites the library’s contribution in press releases or publicity
  • What is the new role of a librarian? Non-traditional work
  • Publications from the faculty; visibility in altmetrics
  • Can the customer get the grant he/she applies
  • Time saved by faculty e.g. lecture writing, student remediation
  • Proportion of knowledge syntheses that reach publication
  • Increase in application usage after a conference
  • Chocolate/biscuits/cards — how many gifts (you get from customers)

Identifying new indicators

During the second part of the session, the participants discussed how (or if) benchmarking can provide tools for creating an evidence base for health librarianship. There were five questions; the participants came up with lots of ideas and then voted for the ones they liked best. Here are the most popular ideas and proposals — again, quotations from post-it notes:

1. How can benchmarking provide tools for creating an evidence base for health librarianship?

  • Develop indicators which are clearly articulated and detailed to enable consistent application across different services.

2. How can cooperation and benchmarking be seen as research activities?

  • Will help develop “industry standards” to be adopted by the profession.

3. How can cooperation and benchmarking be used for measuring the impact of libraries and librarians?

  • It will give you “evidence” by comparing both qualitative and quantitative measures.

4. How to inspire staff for change?

  • By empowering them, trusting them, and giving them freeway to make decisions.

5. Something else?

  • More national and international collaboration.

The best ideas

We are very grateful to all the participants for working hard and being so active. We hope everyone got something to bring back, some food for thought, and was able to find new connections and extend their professional network.

* ISO 11620 (2014) definition of indicator: Expression used to characterize activities both in quantitative and qualitative terms in order to assess the value of the activities characterized, and the associated method.