Outcomes from ICML+EAHIL2017 workshop

At ICML+EAHIL2017 we facilitated a workshop called “Cooperation and benchmarking – finding the value and impact together”.

We invited the participants to take part in our benchmarking project. We wanted them to help us identify more future-oriented indicators* and also to discuss how — or if — benchmarking can provide tools for creating an evidence base for health librarianship. The goal of the workshop was to find some new and exciting ideas to take further. We used brainwriting as a tool to find and refine the ideas.

[Photos: working on preparations the day before]

The first part of the session was spent on identifying new ideas for indicators to help measure impact and value in international (health) library benchmarking.

The best ideas for indicators — quotations from post-it notes — from this brainwriting session were:

  • Number of high “grade” student essays/exam papers in relation to librarian time spent teaching/tutoring
  • How has the literature search been used to change practice?
  • Impact on national health policies index/indicator
  • When host organisation cites the library’s contribution in press releases or publicity
  • What is the new role of a librarian? Non-traditional work
  • Publications from the faculty; visibility in altmetrics
  • Can the customer get the grant he/she applies
  • Time saved by faculty e.g. lecture writing, student remediation
  • Proportion of knowledge syntheses that reach publication
  • Increase in application usage after a conference
  • Chocolate/biscuits/cards — how many gifts (you get from customers)
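Many of these ideas are simple ratios that only become comparable if every library calculates them in the same way. As a purely illustrative example, not part of the workshop output, the short Python sketch below computes the first indicator in the list, high-grade essays per hour of librarian teaching time, from hypothetical counts of our own invention.

```python
# Illustrative only: hypothetical counts for one teaching period.
high_grade_essays = 18         # student essays/exam papers awarded a high grade
librarian_teaching_hours = 42  # librarian time spent teaching/tutoring, in hours

# The indicator as a simple ratio; writing the formula down explicitly is what
# makes the figure comparable between libraries.
essays_per_teaching_hour = high_grade_essays / librarian_teaching_hours
print(f"High-grade essays per librarian teaching hour: {essays_per_teaching_hour:.2f}")
```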
[Photos: interactive workshop; identifying new indicators]

During the second part of the session the participants discussed how (or if) benchmarking can provide tools for creating an evidence base for health librarianship. There were five questions and the participants came up with lots of ideas and then voted for the ones that they liked best. Here are the ideas and proposals that were most popular — again quotations from post-it notes:

1. How can benchmarking provide tools for creating an evidence base for health librarianship?

  • Develop indicators which are clearly articulated and detailed to enable consistent application across different services.

2. How can cooperation and benchmarking be seen as research activities?

  • Will help develop “industry standards” to be adopted by the profession.

3. How can cooperation and benchmarking be used for measuring the impact of libraries and librarians?

  • It will give you “evidence” by comparing both qualitative and quantitative measures.

4. How to inspire staff for change?

  • By empowering them, trusting them, and giving them freeway to make decisions.

5. Something else?

  • More national and international collaboration.
[Photos: WOWs; the best ideas]

We are very grateful to all the participants for working hard and being so active. We hope everyone got something to bring back, some food for thought, and that they found new connections and were able to extend their professional networks.

* ISO 11620 (2014) definition of indicator: Expression used to characterize activities both in quantitative and qualitative terms in order to assess the value of the activities characterized, and the associated method.


University rankings and publication figures — Is there something to compare from the library’s point of view?

What really interests stakeholders and library directors is finding a way to measure the library's impact on the success of the university: whether and how the library's investments — acquisitions and services, staff and collections, equipment and space — affect the university's success, for instance in the different international higher education rankings.

The background information shows some rankings from the years 2011 and 2012. Below are figures from the CWTS Leiden Ranking 2015, showing each of our universities' size-independent rankings in biomedical and health sciences in comparison to the other universities in the same country — not in comparison to each other, as we have realized that the different funding systems and organizations make direct comparisons difficult and hard to explain.

See the CWTS Leiden Ranking 2015 for explanations of the impact and collaboration indicators. The first two pictures show the collaboration and impact figures of Belgian universities in biomedical and health sciences, the next two the same figures for Finnish universities, and the last two for Norwegian universities.


Did our libraries have any impact on these? We would really like to find collaborators with statistical analysis skills, e.g. to dig deep into the usage numbers of our biomedical and health sciences electronic collections and help us find out whether investing in them is worth the money in each university, in relation to the figures that the different ranking lists provide.
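As a very rough illustration of the kind of analysis we have in mind, the sketch below (in Python) correlates e-collection usage and spend per university with a size-independent ranking indicator such as the Leiden PP(top 10%). The file name, the column names and the choice of a simple rank correlation are our own assumptions for the example, not an established method.

```python
# A minimal sketch, assuming a hand-collected CSV with hypothetical columns:
# university, ecollection_downloads, ecollection_spend_eur, pp_top10
import pandas as pd
from scipy.stats import spearmanr

df = pd.read_csv("library_vs_ranking.csv")  # hypothetical data file

# Crude "value for money" figure: downloads per euro spent on e-collections.
df["downloads_per_eur"] = df["ecollection_downloads"] / df["ecollection_spend_eur"]

# Rank correlation between each investment/usage figure and the ranking indicator.
# With only a handful of universities, nothing stronger than a rank-based
# association would be defensible.
for col in ["ecollection_downloads", "ecollection_spend_eur", "downloads_per_eur"]:
    rho, p = spearmanr(df[col], df["pp_top10"])
    print(f"{col}: Spearman rho = {rho:.2f} (p = {p:.2f})")
```

Even then, a correlation would not show causation; at best it would tell us which investment figures move together with the ranking indicators.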

SciVal is a set of integrated modules that enables institutions to make evidence-based strategic decisions. SciVal consists of three modules: Overview, Benchmarking and Collaboration. Does using them tell us anything about the libraries' impact? Or, when we compare the figures, do we see something the library can do better?

The overall pictures of our universities look like this:

[SciVal Overview screenshots, one per university]

We can compare the scholarly output:

We can also compare the scholarly output in the top 10 publications category:

The figures are interesting, but how much impact do the library and its services have on them? How could we know that?

Our next step will be to explore qualitative indicators regarding the libraries’ impact on publications and research.