Some thoughts on the focus groups from De Montfort University

Given the time of year and the short timescale in which to hold them (just before Easter), we were pleasantly surprised to receive an overwhelming 204 replies to our invitation email for the three LIDP focus groups. The ten-pound print credit incentive must have looked particularly attractive during assignment time, especially as our most generous offering for focus groups so far had not exceeded five pounds.
Expecting less than 50% attendance, we invited twenty students to each focus group and gently let down the rest. Attendance was also better than expected, with thirty-five students attending in total, twenty-six of whom were full-time undergraduate students.

Students were on the whole interested and enthusiastic, and some questions and comments generated lively discussion around the table, especially when talk turned to files mysteriously missing from the library PCs! There were some insightful comments: summing up a conversation about the limitations of some of the library searching tools and ways around them, one student remarked, ‘it seems that a lot of us use different means to go around the library rather than use the library engines as we can’t find things by using them. We are working around the library not through it’.

At first, I was not sure what the focus groups could add to the very neat graphs that Dave has already produced from our quantitative data. Students who attend focus groups are not usually a representative sample, and these groups were no exception. As one student remarked, ‘we have all made the effort to come to this focus group, it kind of shows we are in the same mind’ (i.e. motivated and keen to do well).

However, even with a biased sample of the student population, the focus groups did flesh out the story behind the figures. What these students have in common is their active engagement with the academic world, including the library and its resources. Most of them read beyond the recommended reading, use the online resources, borrow books regularly, and are keen to get a good degree. This does not mean that they do not get frustrated by faulty equipment and missing books, of course, but they all showed a willingness to make sense of their academic environment, some even finding ingenious ways around the perceived inadequacies of our systems.

It would be convenient to think that it is our wonderful and expensive resources that make the difference to students’ performance and ultimately their results. But I suspect that a more crucial factor is the depth of the students’ engagement with their studies rather than the intrinsic value of our resources. My guess is that most of the students attending the focus groups will go on to do well in their studies. They will do well because they are keen, and because this motivation is translated into a willingness to try things out and explore the resources and services at their disposal.

The fact that many students comment on the awkwardness of our systems and searching tools (i.e. catalogues and databases) could also have a role to play in explaining the correlation between Athens logins and degree results. Motivated students are more likely to explore the resources that are available to them, and also more likely to jump over hurdles and persevere to get to the good stuff. So, could the strong correlation between Athens logins and degree results be as much an indicator of students’ motivation and staying power as it is of the usefulness of our resources? And could the advent of discovery tools like Summon or EBSCO Discovery lessen this correlation? Indeed, if searching for ‘quality’ resources becomes as ‘easy’ as searching Google, will usage of online library resources still be a measure of the difference between the good and the not-so-good student? Or will the difference only become noticeable further along the way (e.g. in how students make sense of the information they find)? But if so, will we be able to measure it?
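
Out of idle curiosity, here is a minimal sketch of how such a correlation could be checked, assuming a hypothetical per-student export with an Athens login count and a final degree classification (the file name, column names and grade mapping below are invented for illustration, not the project’s actual data):

```python
# Minimal sketch: 'lidp_sample.csv' and its columns are hypothetical
# stand-ins for whatever per-student export is actually available.
import pandas as pd
from scipy.stats import spearmanr

df = pd.read_csv("lidp_sample.csv")  # one row per student

# Degree classifications are ordinal, so rank them rather than
# treating them as evenly spaced numbers.
grade_rank = {"third": 1, "2:2": 2, "2:1": 3, "first": 4}
df["degree_rank"] = df["degree_class"].map(grade_rank)

# Spearman's rho assumes only a monotonic relationship, which suits
# the skewed distributions typical of usage counts.
rho, p = spearmanr(df["athens_logins"], df["degree_rank"])
print(f"Spearman's rho = {rho:.2f} (p = {p:.3f})")
```

Even a strong rho here would, of course, say nothing about the direction of cause, which is rather the point.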

The focus groups also helped to explain the lack of correlation found between usage of the library itself (i.e. the physical space) and degree results. Although most students use the library regularly, there is a very clear division between those who prefer working in the library and those who prefer working at home. This preference does not appear to be linked to motivation or engagement with their course, but to other factors such as personal preference, distance from the library, the nature of the task undertaken, and the availability of internet access at home. So for these students, using the library as a space is not an indication of how hard they work. Moreover, whilst Athens cannot be used for much besides studying, the library can be used in many ways other than studying (e.g. using the PCs for fun, chatting, or as a meeting place).

All in all, the focus groups were a great opportunity to meet some great students and gain a deeper insight into students’ experience of using the library, and they generated a lot of interesting qualitative data. They also provided me with much food for thought and speculation!

Marie Letzgus
De Montfort University

2 thoughts on “Some thoughts on the focus groups from De Montfort University”

  1. Hi

    Love your blog.

    “And could the advent of discovery tools like Summon or EBSCO Discovery lessen this correlation? Indeed, if searching for ‘quality’ resources becomes as ‘easy’ as searching Google, will usage of online library resources still be a measure of the difference between the good and the not-so-good student? Or will the difference only become noticeable further along the way (e.g. in how students make sense of the information they find)? But if so, will we be able to measure it?”

    Really interesting speculation. A silly question: would the Library Impact Data Project be able to tease this out by comparing institutions with Summon/EDS against those that don’t have it yet?

  2. Hi Aaron

    Good question!

    The early signs from Huddersfield are that our COUNTER full-text downloads have seen massive increases since fully launching Summon (in August 2010), with many databases seeing a 500% increase in downloads.

    I spoke to an Australian librarian about this at ALA, and we discussed what these huge increases actually meant, although I don’t think we came to any conclusion!

    – we often measure value by usage, so higher usage means we’re getting better value for our money
    – because it’s easy to use, are the increases down to the fact that more students are using Summon than used MetaLib?
    – because it’s easier to find relevant articles, are students discovering more articles than they did with MetaLib?

    Also, there’s (Calvin) Mooers’ Law (he’s the guy who coined the term “information retrieval”):

    An information retrieval system will tend not to be used whenever it is more painful and troublesome for a customer to have information than for him not to have it. Where an information retrieval system tends not to be used, a more capable information retrieval system may tend to be used even less.

    …giving people easier access to more information sometimes isn’t a “good thing”, as they have to spend more time evaluating and processing the extra information.

    We won’t get the chance to do this within the LIDP project, but I’m sure we can dig deeper into Huddersfield’s data to try and evaluate the impact of Summon.
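
    For what it’s worth, a first pass at that digging might look something like the sketch below, comparing mean monthly COUNTER full-text downloads before and after the full Summon launch (the file and column names are invented for illustration):

    ```python
    # Rough sketch, not an actual analysis: compare mean monthly COUNTER
    # full-text downloads before and after the Summon launch (August 2010).
    # 'counter_monthly.csv' and its column names are hypothetical.
    import pandas as pd

    df = pd.read_csv("counter_monthly.csv", parse_dates=["month"])

    launch = pd.Timestamp("2010-08-01")
    before = df.loc[df["month"] < launch, "downloads"].mean()
    after = df.loc[df["month"] >= launch, "downloads"].mean()

    pct_change = 100 * (after - before) / before
    print(f"Mean monthly downloads: {before:.0f} before vs {after:.0f} after "
          f"({pct_change:+.0f}% change)")
    ```

    As a sanity check on the arithmetic: a “500% increase” means downloads running at six times their previous level, which is partly why the figure begs interpretation.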
