Category Archives: Benefits

Library analytics bibliography

With thanks to Diane Costello (Executive Officer, CAUL – Council of Australian University Librarians) and Judy Luther  (www.informedstrategies.com) for the suggestion, we have put together a Library analytics bibliography page based on articles we have consulted as part of LIDP.

There is also an excellent new set of resources at CAUL, including bibliographies on return on investment and the value of libraries; the value and impact of university libraries; and library data & text mining.

We would love to hear from you if you have any more suggestions for our list of resources.

A spot of cross-fertilisation!

We’ve spent an interesting week talking to other JISC projects.

We’ll be working very closely with the Copac Activity Data Project over the next 6 months and had a good meeting with them on Tuesday.

CopacAD will conduct primary research to investigate the following additional use cases:

  • an undergraduate from a teaching and learning institution searching for course related materials
  • academics/teachers using the recommender to support the development of course reading lists
  • librarians using the recommendations to support academics/lecturers and collections development

Huddersfield will be one of the libraries providing data and we’ll also be participating in the focus groups.

We are both undertaking work packages around business case feasibility studies and hope to pool our activity by sending out a joint questionnaire later in the project. We will also be participating in a RLUK/SCONUL workshop in April.

Yesterday the project happened upon another JISC project at Huddersfield, the JISC EBEAM Project.

EBEAM will evaluate the impact of e-assessment and feedback on student satisfaction, retention, progression and attainment as well as on institutional efficiency.

EBEAM is looking at GradeMark and we think there is a real opportunity to link this into the LIDP project in the future.

Watch this space for more developments with both projects over the coming months.

Announcing Phase 2 of the Library Impact Data Project

In November 2011 the University of Huddersfield was approached by JISC to submit a proposal for an extension to the original project.

We are very pleased to announce that in late December 2011 funding was approved to take the proposal forward into phase II of the project, which will run from January 2012 to July 2012. Phase II will build upon the work carried out in phase I and covers six aims, which will further exploit the data, investigate possible causal aspects and disseminate findings from both phases, as follows:

  • To add other relevant data such as UCAS points, demographics, retention data etc.
  • To study the impact of in-house projects
  • To use the enriched data to provide better management information
  • To investigate three case studies of courses exhibiting non/low usage of library resources
  • To conduct a feasibility study on the viability of a JISC shared service that involves collection and analysis of library impact data for all UK HE libraries, including a workshop with SCONUL and RLUK to discuss opportunities with usage data and possibilities for shared services
  • To build on the phase 1 toolkit by offering a number of training courses and podcasts aimed at other librarians in UK HE

For further information please see our project proposal and watch out for more posts over the next 7 months.

The Final Blog Post

It has been a short but extremely productive 6 months for the Library Impact Data Project Team. Before we report on what we have done and look to the future, we have to say a huge thank you to our partners. We thought we would be taking a lot on at the start of the project in getting eight universities to partner in a six month project; however, it has all gone extremely smoothly and as always everyone has put in far more effort and work than originally agreed. So thanks go to all the partners, in particular:

Phil Adams, Leo Appleton, Iain Baird, Polly Dawes, Regina Ferguson, Pia Krogh, Marie Letzgus, Dominic Marsh, Habby Matharoo, Kate Newell, Sarah Robbins, Paul Stainthorp

Also to Dave Pattern and Bryony Ramsden at Huddersfield.

So did we do what we said we would do?

Is there a statistically significant correlation across a number of universities between library activity data and student attainment?

The answer is a YES!

There is a statistically significant relationship between both book loans and e-resources use and student attainment, and this is true across all of the universities in the study that provided data in these areas. In some cases the relationship was stronger than in others, but our statistical testing shows that you can believe what you see when you look at our graphs and charts!

Where we didn’t find statistical significance was in entries to the library: although it looks like there is a difference between students with a 1st and a 3rd, there is no overall significance. This is not surprising, as many of us have group study facilities, lecture theatres, cafes and student services in the library, so a student is just as likely to be entering the library for those reasons as for studying purposes.

We want to stress here again that we realise THIS IS NOT A CAUSAL RELATIONSHIP!  Other factors make a difference to student achievement, and there are always exceptions to the rule, but we have been able to link use of library resources to academic achievement.

So what is our output?

Firstly, we have provided all the partners in the project with short library director reports and are in the process of sending out longer in-depth reports. Regrettably, due to the nature of the content of these reports, we cannot share this data; however, we are in the process of anonymising partners’ graphs in order to release charts of averaged results for general consumption.

Furthermore, we are also planning to release the raw data from each partner for others to examine. Data will be released under an Open Data licence at https://library.hud.ac.uk/blogs/projects/lidp/open-data/

Finally, we have been astonished by how much interest there has been in our project. To date we have two articles ready for imminent publication and another two in the pipeline. In addition, by the end of October we will have delivered 11 conference papers on the project. All articles and conference presentations are accessible at: https://library.hud.ac.uk/blogs/projects/lidp/articles-and-conference-papers/

Next steps

Although this project has had a finite goal in proving or disproving the hypothesis, we would now like to go back to the original project which provided the inspiration. This was to seek to engage low/non users of library resources and to raise student achievement by increasing the use of library resources.
This has certainly been a popular theme in questions at the SCONUL and LIBER conferences, so we feel there is a lot of interest in this in the library community. Some of these ideas have also been discussed at the recent Business Librarians Association Conference.

There are a number of ways of doing this, some based on business intelligence and others based on targeting staffing resources. However, we firmly believe that although there is a business intelligence string to what we would like to take forward, the real benefits will be achieved by actively engaging with the students to improve their experience. We think this could be covered in a number of ways.

  • Gender and socio-economic background? This came out in questions from library directors at SCONUL and LIBER. We need to re-visit the data to see whether there are any effects of gender, nationality (UK, other European and international could certainly be investigated) and socio-economic background in use and attainment.
  • We need to look into what types of data are needed by library directors, e.g. for the scenario ‘if budget cuts result in fewer resources, does attainment fall?’. The Balanced Scorecard approach could be used for this.
  • We are keen to see if we add value as a library through better use of resources, and we have thought of a number of possible scenarios we would like to investigate further:
    • Does a student who comes in with high grades leave with high grades? If so why? What do they use that makes them so successful?
    • What if a student comes in with lower grades but achieves a higher grade on graduation after using library resources? What did they do to show this improvement?
    • Quite often students who look to be heading for a 2nd drop to a 3rd in the final part of their course; why is this so?
    • What about high achievers that don’t use our resources? What are they doing in order to be successful and should we be adopting what they do in our resources/literacy skills sessions?
  • We have not investigated VLE use, and it would be interesting to see if this had an effect
  • We have set up meetings with the University of Wollongong (Australia) and Mary Ellen Davis (executive director of ACRL) to discuss the project further. In addition we have had interest from the Netherlands and Denmark for future work surrounding the improvement of student attainment through increased use of resources

With respect to targeting non/low users, we would like to achieve the following:

  • Find out what students on selected non/low use courses think, to understand why students do not engage
  • To check the amount and type of contact subject teams have had with the specific courses to compare library hours to attainment (poor attainment does not reflect negatively on the library support!)
  • Use data already available to see if there is correlation across all years of the courses. We have some interesting data on course year: some courses show no correlation between year-one use and final grade, but others do. By delving deeper into this we could target our staffing resources more effectively to help students at the point of demand.
    • To target staffing resources
  • Begin profiling by looking at reading lists
    • To target resource allocation
    • Does use of resources + wider reading lead to better attainment – indeed, is this what high achievers actually do?
  • To flesh out themes from the focus groups to identify areas for improvement
    • To target promotion
    • Tutor awareness
    • Inductions etc.
  • Look for a connection between selected courses and internal survey results/NSS results
  • Create a baseline questionnaire or exercise for new students to establish level of info literacy skills
    • Net Generation students tend to overestimate their own skills and then demonstrate poor critical analysis once they get onto resources.
    • Use to inform use of web 2.0 technologies on different cohorts, e.g. health vs. computing
  • Set up new longitudinal focus groups or re-interview groups from last year to check progress of project
  • Use data collected to make informed decisions on stock relocation and use of space
  • Refine data collected and impact of targeted help
  • Use this information to create a toolkit which will offer best practice to a given profile
    • E.g. scenario based

Ultimately our goal will be to help increase student engagement with the library and its resources, which as we can now prove, leads to better attainment. This work would also have an impact on library resources, by helping to target our precious staff resources in the right place at the right time and to make sure that we are spending limited funds on the resources most needed to help improve student attainment.

How can others benefit?

There has been a lot of interest from other universities throughout the project. Some universities may want to take our research as proof in itself and just look at their own data; we have provided instructions on how to do this at https://library.hud.ac.uk/blogs/files/lidp/Documentation/DataRequirements.pdf. We will also make available the recipes written with the Synthesis project in the documentation area of the blog, and we will be adding specific recipes for different library management systems in the coming weeks: https://library.hud.ac.uk/blogs/projects/lidp/documentation/
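
To give a rough feel for what that involves, here is a minimal sketch of how a library might pull together a per-student dataset of the measures used in the project (book loans, e-resource use, library entries and final degree result). It is only an illustration in Python: the file and column names are hypothetical, and the definitive list of required fields is in the data requirements document linked above.

```python
# A hypothetical sketch of assembling an anonymised per-student dataset of
# library usage measures plus final degree result; all file and column names
# are invented for illustration.
import pandas as pd

loans = pd.read_csv("loans.csv")                    # book loan records from the LMS
eresources = pd.read_csv("eresource_logins.csv")    # e.g. Athens/Shibboleth/EZProxy logs
entries = pd.read_csv("gate_entries.csv")           # library entry (gate) records
results = pd.read_csv("degree_results.csv")         # final degree classification per student

# Count usage events per (anonymised) student ID.
usage = (
    loans.groupby("student_id").size().rename("loans").to_frame()
    .join(eresources.groupby("student_id").size().rename("eresource_logins"), how="outer")
    .join(entries.groupby("student_id").size().rename("library_entries"), how="outer")
    .fillna(0)
)

# Attach the degree result and summarise average usage by degree class.
dataset = usage.join(results.set_index("student_id")["degree_class"], how="inner")
print(dataset.groupby("degree_class")[["loans", "eresource_logins", "library_entries"]].mean())
```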

For those libraries that want to do their own statistical analysis, this was a complex issue for the project, particularly given the nature of the data we could obtain vs. the nature of the data required to find correlations. As a result, we used the Kruskal-Wallis (KW) test, designed to measure whether there are differences between groups of non-normally distributed data. To confirm non-normal distribution, a Kolmogorov-Smirnov test was run. Unfortunately KW does not tell us where the differences lie, so the Mann-Whitney test was used on specific pairings of degree results, selected on the basis of the boxplot graphs. The number of Mann-Whitney tests has to be limited, because the more tests conducted, the stricter the required significance threshold, so we limited them to three (at a required significance value of 0.0167, i.e. 5% divided by 3). Once the Mann-Whitney tests had been conducted, the effect size of each difference was calculated. All tests other than effect size were run in PASW 18; effect size was calculated manually. It should be noted that we are aware the size of the samples we are dealing with could have indicated relationships where none exist, but we feel our visual data demonstrates relationships that are confirmed by the analysis, and thus that we have a stable conclusion in rejecting the null hypothesis that there is no relationship between library use and degree result.
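
For anyone wanting to reproduce this sequence of tests without PASW/SPSS, the snippet below is a rough equivalent using Python and SciPy. It is a sketch under assumptions rather than the project's actual workflow: the column names and the three degree-result pairings are hypothetical, and the effect size is recovered from an approximate Z value rather than the exact test statistic.

```python
# A sketch of the test sequence described above (Kolmogorov-Smirnov,
# Kruskal-Wallis, three Mann-Whitney tests at 0.05/3, then effect size),
# using SciPy instead of PASW 18. Column names and pairings are hypothetical.
import math
import pandas as pd
from scipy import stats

df = pd.read_csv("library_usage.csv")  # hypothetical: one row per student

# 1. Confirm non-normal distribution with a Kolmogorov-Smirnov test.
ks_stat, ks_p = stats.kstest(df["loans"], "norm",
                             args=(df["loans"].mean(), df["loans"].std()))
print(f"K-S p = {ks_p:.4f} (p < 0.05 suggests the data are not normally distributed)")

# 2. Kruskal-Wallis: do loan counts differ between degree-result groups?
groups = [g["loans"].values for _, g in df.groupby("degree_class")]
kw_stat, kw_p = stats.kruskal(*groups)
print(f"Kruskal-Wallis p = {kw_p:.4f}")

# 3. Mann-Whitney on three pairings of degree results (chosen from boxplots),
#    with the required significance value lowered to 0.05 / 3 ≈ 0.0167.
pairs = [("1st", "3rd"), ("1st", "2:2"), ("2:1", "3rd")]  # hypothetical pairings
alpha = 0.05 / len(pairs)
for a, b in pairs:
    x = df.loc[df["degree_class"] == a, "loans"]
    y = df.loc[df["degree_class"] == b, "loans"]
    u_stat, p = stats.mannwhitneyu(x, y, alternative="two-sided")
    # 4. Effect size r = Z / sqrt(N), with Z approximated from the p-value.
    z = abs(stats.norm.ppf(p / 2))
    r = z / math.sqrt(len(x) + len(y))
    verdict = "significant" if p < alpha else "not significant"
    print(f"{a} vs {b}: p = {p:.4f} ({verdict} at {alpha:.4f}), effect size r = {r:.2f}")
```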

Full instructions on how the tests were run will first be made available to partner institutions and then disseminated publicly through a toolkit in July/August.

Lessons we learned during the project

The three major lessons learned were:

Forward planning for the retention of data. Make sure all your internal systems and people are communicating with each other. Do not delete data without first checking whether other parts of the University require it; often deletion appears to be based on arbitrary decisions rather than institutional policy. You can only work with what you’re able to get!

Beware e-resources data. We always made it clear that the data we were collecting for e-resource use was questionable; during the project we found that much of this data is not collected in the same way across a single institution, let alone 8! Athens, Shibboleth and EZProxy data may all be handled differently – some may not be collected at all. If others find no significance between e-resources data and attainment, they should dig deeper into their data before accepting the outcome.

Legal issues. For more details on this lesson, see our earlier blog post on the legal stuff.

Final thoughts

Although this post is labelled the final blog post, we will be back!

We are adding open data in the next few weeks and during August we will be blogging about the themes that have been brought out in the focus groups.

The intention is then to use this blog to talk about specific issues we come across with data etc. as we carry our findings forward. At our recent final project meeting, it was agreed that all 8 partners would continue to do this via the blog.

Finally a huge thank you to Andy McGregor for his support as Programme Manager and to the JISC for funding us.

Talking to Business Librarians at the BLA Conference

We have been out and about disseminating the early findings of the LIDP project over the last few weeks. We have been delighted with the feedback we have received from conference delegates, and a lot of the comments about possible future directions for research from the CILIP, SCONUL and LIBER conferences have given us food for thought. Many of these comments will appear in the final project blog post before the end of July. However, we had the opportunity at the Business Librarians Association Conference at Sheffield (http://www.bbslg.org/2011Conference.aspx) to test some of these thoughts. After our presentation (http://eprints.hud.ac.uk/10949/) we divided delegates up into a number of groups to discuss a variety of scenarios.

Scenario 1
If we assume a link between library usage and attainment, what does good practice look like? What are the students who gain a first doing differently to their colleagues who get lower grades? Do high achievers choose ‘better’ resources, or are they ‘better’ at choosing resources?
Two groups reported back on this scenario with the following recommendations:

  • Talk to high achievers to find out what they are doing, e.g.
    • Working
    • Using data effectively
    • Using the right resources
  • Establish what good practice is, e.g. finding, using, interpreting
  • Consider the requirements of the subject, for example mathematics courses often require much less resource use than other subjects such as history
  • Qualitative statistics need to be considered in addition to quantitative statistics
  • Consider the impact of information literacy and support services
  • Find out the student’s own personal goals, e.g. why they are attending the course – as a work requirement etc.
  • Look at which resources are being used, such as extended reading, not just how much
  • Teach the students evaluation skills to help them find appropriate resources, not just ‘better’

Scenario 2
If students are not using the library or the resources, what can we do to change their behaviour? Is non-use a resourcing issue or an academic/information skills issue? How could gender, culture and socio-economic background affect library usage and how could this be addressed? Are there scenarios where we should NOT try to increase library use?

Groups considered a number of factors that could be used to change behaviour:

  • Incentives
    • Attached to an assignment
  • Work with and win over the academics
  • Encourage student champions
  • Make sure the resources are embedded and relevant to the subject

Regarding non-use, the groups thought that both issues were relevant. The skills issues required further training and the resources needed simplifying.
Gender, culture and socio-economic background were themes brought out at both the SCONUL and LIBER conferences. One group looked at international students, where it was considered that they were too dependent on Google – does this mean our resources are too difficult to understand? It was also felt that there is a focus on generalisations, e.g. ‘international students’, rather than looking at individuals. Another group considered that it was a cultural issue and that students were guided to the ‘right answer’ via reading lists, rather than reading around the subject.
Finally discussion turned to work-life balance and whether students should be logging in at 2am, and whether our culture of 24×7 access was a healthy one.

Scenario 3
Can we actually demonstrate that the library adds value? E.g. if a student enters university with average UCAS points and attains a first class degree having used library resources to a high level, does this prove the library has added value to the student achievement? Have we done anything? Do they need us?

The short answer to this scenario was yes!
We receive feedback, both internal and external, and have provided learning spaces and essential resources at the very least. We can also show that we have promoted our services and embedded information literacy skills into the curriculum by working successfully with academic staff. It was thought that we add to the employability of students by teaching them research skills and giving certification, e.g. Bloomberg etc.

Scenario 4
If the hypothesis is proved to be correct, does cutting library budgets mean that attainment will fall? Is this something that can be used at director level to protect resource budgets/subject librarians? Should we be concerned about implications for publishers if the hypothesis is proven?

The group that looked at this scenario considered that further use of statistics was required to find out what students were reading. This would allow stock to be rationalised, and the reduced budget could be used to better target appropriate resources.

In addition it was suggested that other services, such as inductions and information literacy training, be audited and evaluated in order to provide more effective targeting.

It was also felt that there was an absolute minimum spend for resources; once spend dropped below this level the impact would be huge, with insufficient resources to support courses.

The group felt that this could be used at Director level and that evidence would be required to support this.
Big deals came up in the final point from this scenario. Discussion centred on a standoff between the need for better products and ongoing financial commitments.

Many thanks to all the delegates for allowing us to blog about their comments and to the BLA for letting us loose at their conference. We’ll be adding some of these comments to our final blog post.

What will this project do for library users?

The project aims to draw some pretty big conclusions about library usage and attainment by the end of the data analysis, but what can we actually do with this information once we’ve got proof? What use is it to our customers?

We’ve got two main groups of library users: staff and students. We aim to use our quantitative data to pinpoint groups of students who have a particular level of attainment. We’ll work with staff in order to improve poor scores and learn from those who are awarded high scores, regardless of whether they are high or low users of our resources and facilities. Focus groups held now, and most likely regularly in the future, will tell us more about people who use the library resources less but achieve good degree results. If the materials we are providing aren’t what students want to use, we can tailor our collections to reflect their needs as well as ensure they get the right kind of information their tutors want them to use.

The student benefits are pretty obvious – the more we can advise and communicate to them and encourage use of library staff, and electronic and paper resources, the more likely they are to get a good degree and get value from their time (and money!) spent at university.  Once again we state here that we are aware of other factors in student attainment, but a degree is not achieved without having some knowledge of the subject, and we help supplement the knowledge communicated by lecturers. 

Students get value for money and hopefully enjoy their university experience, lecturers ensure students get the right kind of support and materials they need, and we make sure our budget is used appropriately.  Pretty good, huh?