How to find time for benchmarking or other cooperation?

Taking part in library development and projects should be a natural activity for any library staff member. The great challenge is finding time to dedicate to such important tasks. A regular week in the library is filled with planned activities, meetings, events, and things that simply appear and must be solved on the spot, leaving no time over. It is easy to ignore tasks that are not visible in the plan of action or the calendar. These tasks often seem “less important” and end up out of sight, out of mind, no matter how exciting and useful the project is. This happens to in-house projects and international cooperation alike, and it is also our experience in the benchmarking project: it is difficult to find time.

Our project started in 2013; the initiative came from the library director of the University of Eastern Finland, and the other two library directors supported the project mainly by agreeing to let their staff spend time on it. None of us has a budget or dedicated time for this project; we have carried our normal tasks throughout, except during the site visits and some face-to-face meetings. We have kept costs and use of time to a minimum, as we mainly work online. Funding for the visits came from the Erasmus staff exchange programme and from the libraries’ budgets.


Karen and Ghislaine taking a look at a handout in Brussels.

The project has clear goals and a project plan that give direction and deadlines. Ours is a best-practice benchmarking aimed at improving services; it is much more a process than a traditional project. The work is loosely organised; there is no leader, or rather we are all leaders. The three of us are equal in all decisions, and our roles are based on our personalities and competencies, as they presumably must be in a project of this character.

So how much time have we spent? Since January 2014 (the main project period) we have each used roughly 5 % of our total work time:

  • Library visits: 3 weeks
  • Work together at EAHIL meetings: 3 days
  • Skype monthly meetings and preparations: 3 weeks
  • Planning the workshop for Edinburgh: 1 week
  • Planning the presentation and writing the full-text article for Seville: 1 week

These scheduled activities amount to roughly 8.5 weeks for each of us, out of the 156 weeks in the 3-year period. The real problem is finding time for individual activities such as reading and preparing between our meetings.
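A quick check of the arithmetic: 3 weeks + 3 days + 3 weeks + 1 week + 1 week comes to roughly 8.5 weeks (counting five working days per week), and 8.5 / 156 ≈ 5.4 %, which matches the rough 5 % figure above.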

 


Tuulevi writing in Kuopio

 

Collaboration tools have been important for spending time effectively, both during and between meetings. The most useful tools in our cooperation have been:

  • Dropbox for all kinds of data: meeting agendas and minutes, collected data, plans, photos and so on
  • Google Hangouts for online meetings and collaborative writing
  • WordPress blog for communicating our results

The blog was originally intended to document the library visits in Trondheim, Kuopio and Brussels, but it has become an important aid in keeping focus and momentum. We use the blog as a planning tool: each blog post explores and describes a new topic, with a deadline and a responsible person. Parts of our online meetings are used to finalise and publish new blog posts.


Ghislaine and Karen with Frédéric Brodkom in Louvain-la-Neuve

Working on an international project with limited resources is challenging but also rewarding. It requires self-discipline to allocate time, as well as support and understanding from colleagues and supervisors. It is important not to get frustrated by insufficient time, or when meetings and deadlines have to be postponed.

We have learned a lot from the project, from working together and from sharing with other EAHIL colleagues. So far the benchmarking project has been a continuing process of evaluating and developing the libraries’ functions and staff competencies, as well as of learning about different ways of managing a library.


Choosing ISO indicators

As considering ISO (International Organization for Standardization) performance indicators made sense in a library benchmarking project, we decided to select a few of them from ISO 11620 (2014).

  1. The first step was to read the standard in full and decide, in theory, which indicators could provide useful information.
  2. The second step consisted of testing them with actual data from our libraries.
  3. The third step is to use them to produce information.

ISO Indicators chosen and discussed

The Users per Capita indicator stresses the importance of the library as a place for study and meeting and as a learning centre, and indicates the institution’s support for these tasks. We decided to consider only students, including PhD students, for this indicator, as they are the main actual users of the physical library.

Staff per Capita assesses the number of library employees per 1 000 members of the population to be served; the amount of work to be done can be considered proportional to the size of that population. For this indicator we decided to count students, including PhD students, as the main actual users of the physical library, plus the academic staff from the faculties and hospitals. Hospital nursing staff do use the library (physically or not), though less than academics in our opinion, so we agreed to add 10 per cent of this personnel as well.
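To make the weighting concrete, here is a minimal sketch of the calculation in Python. All headcounts are invented for illustration; only the 10 per cent weighting and the per-1 000 definition come from the text above.

    # Hypothetical headcounts -- not actual data from our libraries
    students_incl_phd = 12_000      # students, including PhD students
    academic_staff = 1_500          # academic staff from faculties and hospitals
    hospital_nursing_staff = 3_000  # counted at 10 %, as agreed above

    # Population to be served, with the agreed 10 % weighting
    population = students_incl_phd + academic_staff + 0.10 * hospital_nursing_staff

    library_staff_fte = 35          # library employees, in full-time equivalents

    # Staff per Capita: library staff per 1 000 members of the population to be served
    staff_per_capita = library_staff_fte / (population / 1_000)
    print(f"Staff per capita: {staff_per_capita:.2f} per 1 000 population")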

The Number of User Attendances at Training Lessons per Capita can be used to assess the library’s success in reaching its users through the provision of training lessons. As this performance indicator is applicable to libraries with a defined population to be served, and this population is impossible for us to define, we decided not to consider it.

The User Services Staff as a Percentage of Total Staff indicator can be used to determine the effort the library devotes to public services in relation to background services. User services include the following functions: lending, reference, interlibrary lending, user education, photocopying, shelving, and retrieving items. We decided to use it.
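With made-up FTE figures, this chosen indicator is a simple ratio:

    # Hypothetical full-time equivalents -- illustrative only
    user_services_fte = 14.0  # lending, reference, ILL, user education, shelving...
    total_staff_fte = 35.0

    # User services staff as a percentage of total staff
    share = user_services_fte / total_staff_fte * 100
    print(f"User services staff: {share:.1f} % of total staff")  # 40.0 %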

We now have to use our results, interpret them and formulate recommendations. These will probably be communicated in a paper or a conference presentation in the coming months.

University rankings and publication figures — Is there something to compare from the library’s point of view?

What really interests the stakeholders and library directors is finding a way to measure the library’s impact on the success of the university: whether, and how, the library’s investments (acquisitions and services, staff and collections, equipment and space) affect the university’s success, for instance in the various international higher education rankings.

The background information shows some rankings from the years 2011 and 2012. Below are figures from the CWTS Leiden Ranking 2015, showing each of our universities’ size-independent ranking in biomedical and health sciences in comparison to the other universities in the same country, not in comparison to each other: we have realised that the funding systems and the organisations make direct comparisons difficult and hard to explain.

See the CWTS Leiden Ranking 2015 for explanations of the impact and collaboration factors. The first two pictures show the collaboration and impact figures of the Belgian universities in biomedical and health sciences, the next two the same figures for the Finnish universities, and the last two for the Norwegian universities.


Did our libraries have any impact on these? We would really like to find collaborators with statistical analysis skills, for example to dig deep into the usage figures of our biomedical and health sciences electronic collections, to help us find out whether investing in them is worth the money at each university in relation to the figures that the different ranking lists provide.
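As a rough illustration of the kind of analysis we are hoping for, the sketch below correlates invented yearly download counts with invented ranking scores; the data, and the choice of a plain Pearson correlation as a first look, are assumptions for illustration only.

    from statistics import correlation  # requires Python 3.10+

    # Invented yearly figures for one university
    ecollection_downloads = [410_000, 455_000, 470_000, 520_000, 560_000]
    ranking_impact_score = [0.112, 0.115, 0.118, 0.117, 0.123]

    # A naive first look: do the two series move together at all?
    r = correlation(ecollection_downloads, ranking_impact_score)
    print(f"Pearson r between downloads and impact score: {r:.2f}")

    # Correlation is of course not causation; a serious analysis would need
    # more data points and proper controls -- hence our call for collaborators.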

SciVal is a set of integrated modules that enables institutions to make evidence-based strategic decisions. It consists of three modules: Overview, Benchmarking and Collaboration. Does using them tell us anything about the libraries’ impact? Or, when we compare the figures, do we see something the library could do better?

The overall pictures of our universities look like this:

[SciVal Overview figures for the three universities]
We can compare the scholarly output:

We can compare the scholarly output also in the top 10 % publications category:
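To show what a top 10 % metric measures in principle, here is a simplified sketch with invented citation counts; SciVal’s actual calculation is field- and year-normalised, so this is only an approximation.

    # Invented citation counts for two universities' publications
    uni_a = [0, 1, 2, 3, 5, 8, 12, 20, 35, 60]
    uni_b = [1, 1, 2, 4, 6, 7, 9, 15, 18, 25]

    # Citation threshold marking the top 10 % of the pooled publications
    pooled = sorted(uni_a + uni_b, reverse=True)
    threshold = pooled[len(pooled) // 10 - 1]

    for name, pubs in (("University A", uni_a), ("University B", uni_b)):
        share = sum(c >= threshold for c in pubs) / len(pubs) * 100
        print(f"{name}: {share:.0f} % of publications in the top 10 %")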

The figures are interesting, but how much impact do the library and its services have on them? How could we know that?

Our next step will be to explore qualitative indicators regarding the libraries’ impact on publications and research.